Definitions
from The American Heritage® Dictionary of the English Language, 5th Edition.
- noun In information theory, a mathematical measure of the degree of randomness in a set of data, with greater randomness implying higher entropy and greater predictability implying lower entropy.
from Wiktionary, Creative Commons Attribution/Share-Alike License.
- noun *information theory* A measure of the uncertainty associated with a random variable; a measure of the average information content one is missing when one does not know the value of the random variable (usually in units such as bits); the amount of information (measured in, say, bits) contained per average instance of a character in a stream of characters.
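The "bits per character of a stream" reading of the definition above can be sketched in Python; the function name and the sample strings here are illustrative, not part of any standard API:

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Average information content, in bits, per character of `text`."""
    counts = Counter(text)
    total = len(text)
    # Sum p * log2(1/p) over the observed character frequencies.
    return sum((n / total) * math.log2(total / n) for n in counts.values())

# A stream uniform over 4 distinct characters carries 2 bits per character:
print(shannon_entropy("abcd"))  # prints 2.0
# A perfectly predictable stream carries 0 bits per character:
print(shannon_entropy("aaaa"))  # prints 0.0
```

Higher randomness in the stream yields a larger value; greater predictability yields a smaller one, matching the definition above.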